Mixtures of Experts Estimate A Posteriori Probabilities

Author

  • Perry Moerland
Abstract

The mixtures of experts (ME) model offers a modular structure suitable for a divide-and-conquer approach to pattern recognition. It has a probabilistic interpretation in terms of a mixture model, which forms the basis for the error function associated with MEs. In this paper, it is shown that for classification problems the minimization of this ME error function leads to ME outputs estimating the a posteriori probabilities of class membership of the input vector.
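The structural property the abstract relies on can be seen directly in the ME forward pass: the output is a gating-weighted convex combination of expert outputs, so it is non-negative and sums to one, as a posterior probability estimate must. The following is an illustrative sketch only (the weight shapes and single-layer experts are assumptions, not the paper's exact architecture):

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def moe_forward(x, gate_W, expert_Ws):
    """Minimal mixture-of-experts classifier forward pass.

    gate_W: (d, K) gating weights; expert_Ws: list of K (d, C) expert weights.
    The output is a convex combination of expert softmax outputs, so it
    sums to one and can be read as an estimate of P(class | x).
    """
    g = softmax(x @ gate_W)                                  # gating probs, (K,)
    experts = np.stack([softmax(x @ W) for W in expert_Ws])  # expert outputs, (K, C)
    return g @ experts                                       # mixture output, (C,)

rng = np.random.default_rng(0)
d, K, C = 4, 3, 2                       # input dim, experts, classes (arbitrary)
x = rng.normal(size=d)
out = moe_forward(x, rng.normal(size=(d, K)),
                  [rng.normal(size=(d, C)) for _ in range(K)])
print(out, out.sum())                   # entries in [0, 1], summing to 1
```

This shows only the representational property; the paper's contribution is that minimizing the associated ME error function actually drives these outputs toward the true posteriors.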


Similar resources

ALGEBRAIC NONLINEARITY IN VOLTERRA-HAMMERSTEIN EQUATIONS

An a posteriori error estimate for the numerical solution of nonlinear Volterra-Hammerstein equations is given. We present an error upper bound for nonlinear Volterra-Hammerstein integral equations, in which the form of nonlinearity is algebraic, and develop an a posteriori error estimate for the recently proposed method of Brunner for these problems (the implicitly linear collocation method)...

Full text

A Self-organized Multi Agent Decision Making System Based on Fuzzy Probabilities: The Case of Aphasia Diagnosis

Aphasia diagnosis is a challenging medical diagnostic task due to linguistic uncertainty and vagueness, a large number of imprecise measurements, inconsistencies in the definition of aphasic syndromes, and natural diversity and subjectivity in test subjects as well as in the opinions of the experts who diagnose the disease. In this paper we present a new self-organized multi-agent system that diagno...

Full text

Towards EM-style Algorithms for a posteriori Optimization of Normal Mixtures

Expectation maximization (EM) provides a simple and elegant approach to the problem of optimizing the parameters of a normal mixture on an unlabeled dataset. To accomplish this, EM iteratively reweights the elements of the dataset until a locally optimal normal mixture is obtained. This paper explores the intriguing question of whether such an EM-style algorithm exists for the related and appar...

Full text

Neural Network Classifiers Estimate Bayesian a posteriori Probabilities

Many neural network classifiers provide outputs which estimate Bayesian a posteriori probabilities. When the estimation is accurate, network outputs can be treated as probabilities and sum to one. Simple proofs show that Bayesian probabilities are estimated when desired network outputs are 1 of M (one output unity, all others zero) and a squared-error or cross-entropy cost function is used. Resu...

Full text
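The claim in the snippet above can be checked numerically: with 1-of-M targets and a cross-entropy cost, the minimizing output for a fixed input equals the class frequencies of its labels, i.e. the a posteriori probabilities. This is an illustrative demo on synthetic labels (the class distribution and learning rate are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
p_true = np.array([0.7, 0.2, 0.1])            # assumed P(class | x) for one fixed input
labels = rng.choice(3, size=20000, p=p_true)  # sampled class labels
T = np.eye(3)[labels]                         # 1-of-M (one-hot) target matrix

# Fit a single softmax output by gradient descent on the mean cross-entropy.
# The gradient of mean cross-entropy w.r.t. the logits is y - mean(T).
z = np.zeros(3)
for _ in range(2000):
    y = np.exp(z - z.max())
    y /= y.sum()
    z -= 0.5 * (y - T.mean(axis=0))

print(np.round(y, 2))  # close to the empirical class frequencies
```

The fixed point of the update is exactly `y = T.mean(axis=0)`, the empirical label frequencies, matching the cited result that such networks estimate Bayesian posteriors.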



Publication date: 1997